Human vision is biased: we are good at identifying members of our own race or ethnicity and, by comparison, bad at identifying almost everyone else.214 Yet many agencies using face recognition believe that machine vision is immune to human bias. In the words of one Washington police department, face recognition simply “does not see race.”215
The reality is far more complicated. Studies of racial bias in face recognition algorithms are few and far between. The research that has been done, however, suggests that these systems do, in fact, show signs of bias. The most prominent study, co-authored by an FBI expert, found that several leading algorithms performed worse on African Americans, women, and young adults than on Caucasians, men, and older people, respectively.216 In interviews, we were surprised to find that two major face recognition companies did not test their algorithms for racial bias.217
Racial bias intrinsic to an algorithm may be compounded by outside factors. African Americans are disproportionately likely to come into contact with—and be arrested by—law enforcement.218 This means that police face recognition may be overused on the very segment of the population on which it underperforms. It also means that African Americans will likely be overrepresented in mug shot-based face recognition databases. Finally, when algorithms search these databases, the task of selecting a final match is often left to human analysts, a step that may reintroduce human bias into the system.
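To make concrete what testing an algorithm for this kind of bias involves, the sketch below computes error rates separately for each demographic group from a set of labeled comparison trials. This is a minimal, hypothetical illustration: the data, threshold, and function names are our own, not any vendor's actual test suite or the methodology of Klare et al.

```python
# A minimal sketch of a per-demographic accuracy audit, the kind of test
# the interviewed vendors reportedly did not perform. All data, names,
# and thresholds here are hypothetical illustrations.
from collections import defaultdict

def error_rates_by_group(trials, threshold=0.75):
    """For each demographic group, compute the false non-match rate
    (FNMR: genuine pairs scored below the threshold) and the false match
    rate (FMR: impostor pairs scored at or above the threshold)."""
    counts = defaultdict(lambda: {"genuine": 0, "fn": 0, "impostor": 0, "fm": 0})
    for group, score, same_person in trials:
        c = counts[group]
        if same_person:
            c["genuine"] += 1
            if score < threshold:
                c["fn"] += 1   # missed a true match
        else:
            c["impostor"] += 1
            if score >= threshold:
                c["fm"] += 1   # wrongly matched two different people
    return {
        g: {
            "FNMR": c["fn"] / c["genuine"] if c["genuine"] else None,
            "FMR": c["fm"] / c["impostor"] if c["impostor"] else None,
        }
        for g, c in counts.items()
    }

# Hypothetical comparison trials: (group, similarity score, ground truth).
trials = [
    ("group_a", 0.91, True), ("group_a", 0.62, True), ("group_a", 0.80, False),
    ("group_b", 0.70, True), ("group_b", 0.55, True), ("group_b", 0.78, False),
]
print(error_rates_by_group(trials))
```

In practice, "performed worse" in a study like Klare et al.'s means that, at the same decision threshold, one group shows consistently higher error rates than another; a per-group breakdown like this one is how such a gap is detected.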
- 214. See, e.g., Gustave A. Feingold, The Influence of Environment on Identification of Persons and Things, 5 J. of the Am. Inst. of Crim. L. & Criminology 39, 50 (May 1914–March 1915) (“Now it is well known that, other things being equal, individuals of a given race are distinguishable from each other in proportion to our familiarity, to our contact with the race as a whole.”); Luca Vizioli, Guillaume A. Rousselet & Roberto Caldara, Neural Repetition Suppression to Identity is Abolished by Other-Race Faces, 107 Proc. of the Nat’l Acad. of Sci. of the U.S. 20081, 20081 (2010), http://www.pnas.org/content/107/46/20081.abstract. This problem is known as the “other-race” effect. Id.
- 215. See Seattle Police Department, Booking Photo Comparison System FAQs, Document p. 009377. In 2009, Scott McCallum, then-systems analyst for the Pinellas County Sheriff’s Office face recognition system, made the same claim to the Tampa Bay Times: “[The software] is oblivious to things like a person’s hairstyle, gender, race or age,” McCallum said. Kameel Stanley, Face Recognition Technology Proving Effective for Pinellas Deputies, Tampa Bay Times, July 17, 2009, http://www.tampabay.com/news/publicsafety/crime/facial-recognition-technology-proving-effective-for-pinellas-deputies/1019492.
- 216. See Brendan F. Klare et al., Face Recognition Performance: Role of Demographic Information, 7 IEEE Transactions on Information Forensics and Security 1789, 1797 (2012) (hereinafter “Klare et al.”).
- 217. See Interview with Face Recognition Company Engineer (Anonymous) (Mar. 9, 2016) (notes on file with authors); Interview with Face Recognition Company Engineer (Anonymous) (Mar. 16, 2016) (notes on file with authors).
- 218. See, e.g., Brad Heath, Racial Gap in U.S. Arrest Rates: ‘Staggering Disparity’, USA Today, Nov. 19, 2014, http://www.usatoday.com/story/news/nation/2014/11/18/ferguson-black-arrest-rates/19043207.